10. Personalized and Focused Web Spiders
Abstract
As the size of the Web continues to grow, searching it for useful information has become increasingly difficult. Researchers have studied different ways to search the Web automatically using programs known as spiders, crawlers, Web robots, Web agents, Webbots, etc. In this chapter, we review research in this area, present two case studies, and suggest some future research directions.
Similar Resources
Focused Crawling Techniques
The need for increasingly specific replies to web search queries has prompted researchers to work on focused web crawling techniques for web spiders. A variety of lexical and link-based approaches to focused web crawling are introduced in the paper, highlighting important aspects of each. General Terms: Focused Web Crawling, Algorithms, Crawling Techniques.
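The lexical scoring and best-first frontier that such focused crawlers rely on can be illustrated with a minimal sketch. The link graph, page texts, and topic terms below are hypothetical toy data (not from any of the cited papers), so the example runs without network access:

```python
import heapq

# Hypothetical in-memory link graph and page texts standing in for the Web.
LINKS = {
    "seed": ["a", "b"],
    "a": ["c"],
    "b": ["d"],
    "c": [],
    "d": [],
}
TEXT = {
    "seed": "web crawling spiders",
    "a": "focused crawling techniques for web spiders",
    "b": "cooking recipes",
    "c": "lexical and link based focused crawling",
    "d": "gardening tips",
}
TOPIC = {"focused", "crawling", "web", "spiders"}

def relevance(page):
    """Lexical score: fraction of topic terms appearing in the page text."""
    words = set(TEXT[page].split())
    return len(TOPIC & words) / len(TOPIC)

def focused_crawl(seed, limit=10):
    """Best-first crawl: always expand the most topically promising page."""
    frontier = [(-relevance(seed), seed)]  # max-heap via negated scores
    visited, order = set(), []
    while frontier and len(order) < limit:
        _, page = heapq.heappop(frontier)
        if page in visited:
            continue
        visited.add(page)
        order.append(page)
        for link in LINKS[page]:
            if link not in visited:
                heapq.heappush(frontier, (-relevance(link), link))
    return order

print(focused_crawl("seed"))
```

On this toy graph the crawler visits the on-topic pages `a` and `c` before the off-topic `b` and `d`, which is exactly the behavior that distinguishes a focused crawler from a breadth-first one.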
Towards an Effective Personalized Information Filter for P2P Based Focused Web Crawling
Information access is one of the hottest topics of the information society, and it has become even more important since the advent of the Web, but general Web search engines still cannot find correct and timely information for individuals. In this paper, we propose a Peer-to-Peer (P2P) based decentralized focused Web crawling system called PeerBridge to provide user-centered, ...
Multi-modal Services for Web Information Collection Based on Multi-agent Techniques
With the rapid information growth on the Internet, web information collection is becoming increasingly important in many web applications, especially in search engines. The performance of web information collectors has a great influence on the quality of search engines, so when it comes to web spiders, we usually focus on their speed and accuracy. In this paper, we point out that customizabilit...
Hybrid Adaptive Educational Hypermedia Recommender Accommodating User’s Learning Style and Web Page Features
Personalized recommenders have proved useful as a solution to the information overload problem. Especially in Adaptive Hypermedia Systems, a recommender is the main module that delivers suitable learning objects to learners. Recommenders suffer from the cold-start and the sparsity problems. Furthermore, obtaining a learner’s preferences is cumbersome. Most studies have only focused...
Expanding Reinforcement Learning Approaches for Efficient Crawling the Web
The amount of accessible information on the World Wide Web is increasing so rapidly that a general-purpose search engine cannot index everything on the Web. Focused crawlers have been proposed as a potential approach to overcome the coverage problem of search engines by limiting their domain of concentration. Focused crawling is a technique which is able to crawl particular topical portions ...
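A reinforcement-learning crawler of the kind this abstract describes typically learns, from relevance rewards observed after fetching pages, which link features predict on-topic content. A minimal sketch of that idea is below; the anchor-word features, reward values, and update rule are hypothetical illustrations, not the method of the cited paper:

```python
# Hypothetical training episodes: anchor words of a followed link and the
# relevance reward observed after fetching the target page.
EPISODES = [
    ({"focused", "crawling"}, 1.0),
    ({"sports", "news"}, 0.0),
    ({"web", "spiders"}, 0.8),
    ({"cooking"}, 0.0),
] * 50  # repeat the episodes to let the estimates converge

# Q[word]: learned estimate of the reward that follows links whose anchor
# text contains this word (a simple feature-based value function).
Q = {}
ALPHA = 0.1  # learning rate

def score(words):
    """Predicted value of a link = mean learned value of its anchor words."""
    return sum(Q.get(w, 0.0) for w in words) / len(words)

for words, reward in EPISODES:
    # Incremental update of each feature's estimate toward the observed reward.
    for w in words:
        Q[w] = Q.get(w, 0.0) + ALPHA * (reward - Q.get(w, 0.0))

# After training, topical anchor words outscore off-topic ones, so the
# crawler would prioritize the corresponding links in its frontier.
print(score({"focused", "crawling"}) > score({"cooking"}))
```

The learned `score` would then replace a fixed lexical heuristic when ordering the crawl frontier, letting the spider adapt its link-selection policy as it observes more pages.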
Publication date: 2003